1. Parsing Phase

There are basically three steps involved in processing the code:

  • Parsing the code
  • Compiling the code
  • Executing the code

TOKENS -> SYNTAX PARSER -> AST

Code is broken down into its respective tokens.
A token is the smallest unit of a program that is meaningful to the compiler or interpreter.

Example: const sum = 5 + 7
const is a token, sum is a token,
= is a token,
5 is a token, + is a token, and 7 is a token.

The tokens are given to the SYNTAX PARSER.
The SYNTAX PARSER converts the tokens into an ABSTRACT SYNTAX TREE (AST).
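
For reference, this is roughly the AST a JavaScript parser produces for the example above. The node names follow the common ESTree shape used by parsers such as Acorn; V8's internal AST differs in its details, so treat this only as an illustration:

{
  "type": "VariableDeclaration",
  "kind": "const",
  "declarations": [{
    "type": "VariableDeclarator",
    "id":   { "type": "Identifier", "name": "sum" },
    "init": {
      "type": "BinaryExpression",
      "operator": "+",
      "left":  { "type": "Literal", "value": 5 },
      "right": { "type": "Literal", "value": 7 }
    }
  }]
}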

2. Compilation Phase

Uses Just-in-Time (JIT) compilation.

The V8 engine initially uses the IGNITION INTERPRETER to interpret the code.
On further executions, the V8 engine finds patterns such as

  • frequently executed functions,
  • frequently used variables,

and uses the TURBOFAN COMPILER to compile them to improve performance.

The V8 engine uses both an INTERPRETER and a COMPILER:

  1. an interpreter - IGNITION INTERPRETER

  2. a compiler - TURBOFAN COMPILER

    ==**IGNITION INTERPRETER**==
    - input: `ABSTRACT SYNTAX TREE`
    - output: `BYTE CODE`

    ==**TURBOFAN COMPILER**==
    - input: `BYTE CODE` from the ==**ignition interpreter**==,
      `FEEDBACK` from the ==**ignition interpreter**==
    - output: optimized `MACHINE CODE`
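
One rough way to watch this pipeline locally is through the V8 flags that Node.js passes on to the engine. Flag names and availability vary by Node/V8 version, so this is only a sketch:

// pipeline-demo.js
// node --print-bytecode --print-bytecode-filter=add pipeline-demo.js  -> Ignition bytecode for add()
// node --trace-opt --trace-deopt pipeline-demo.js                     -> TurboFan (de-)optimization log

function add(a, b) {
  return a + b;
}

// Call the function enough times that V8 marks it as "hot"
// and hands it to TURBOFAN for optimization.
let total = 0;
for (let i = 0; i < 100000; i++) {
  total = add(total, i);
}
console.log(total);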

If the performance degrades, or the parameters passed to the function change their type,
V8 simply discards the optimized code (de-optimization) and falls back to the interpreter.

Example:
If the compiler compiles a function
-> assuming the data fetched from an API call is of type String,
-> the code fails when the data received is of type Object.
In this case,
the compiler de-optimizes the code,
falls back to the interpreter, and updates the type feedback.
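
A minimal sketch of the same situation (the function and data here are made up for illustration; running it with node --trace-deopt should show the bailout, depending on the Node/V8 version):

function handleResponse(data) {
  // While only strings have been seen, the optimized code can assume `data` is a string.
  return data.length;
}

// Warm-up: thousands of string calls make the function "hot",
// so TURBOFAN compiles it with String type feedback.
for (let i = 0; i < 100000; i++) {
  handleResponse("response-" + i);
}

// A different type arrives: the assumption baked into the optimized code no longer holds,
// so V8 de-optimizes, falls back to IGNITION, and updates the feedback.
handleResponse({ status: 200, body: "ok" });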

3. Execution Phase

The BYTE CODE is executed using the Memory Heap and the Call Stack of the V8 engine’s runtime environment.
This is how your code is stored in memory:

  • Memory Heap is the place where all the variables and functions are assigned memory.
  • Call Stack is the place where each function, when called, is pushed onto the stack and popped off after its execution.
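
A small sketch of the split, with hypothetical functions, just to show what lives where:

function inner(p) {
  return p.name.length;           // popped off the Call Stack once it returns
}

function outer() {
  // The object itself is allocated on the Memory Heap;
  // the `person` binding lives in outer()'s stack frame.
  const person = { name: "GeeksforGeeks" };
  return inner(person);           // pushes inner() onto the Call Stack
}

// Call Stack over time: [outer] -> [outer, inner] -> [outer] -> []
console.log(outer());             // 13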


When the interpreter interprets the code, it uses an object structure where
the keys are the byte codes and
the values are the functions that handle the corresponding byte code.

The V8 engine orders the values in the form of a list in memory, which is saved into a Map thereby saving a lot of memory. 

Example: 

let Person = { name: "GeeksforGeeks" };
Person.age = 20;

In the above example, a map (hidden class) describes the Person object, which has the property name.
The second line adds the property age; this creates a new map that is linked back to the previous one.
The problem with this approach is that it takes linear time to search through these linked lists.
To combat the problem, V8 provides us with Inline Caches (IC).
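
Before looking at inline caches in detail, a small sketch of the hidden-class (map) transitions described above. The sharing behaviour is conceptual here; the actual transitions are internal to V8:

// Both objects gain the same properties in the same order,
// so V8 can describe them with the same chain of maps (hidden classes).
const a = { name: "GeeksforGeeks" };
a.age = 20;

const b = { name: "V8" };
b.age = 10;          // reuses the name -> age transition created for `a`

// Adding properties in a different order creates a different transition chain,
// so lookups for these objects cannot share the same cached map information.
const c = {};
c.age = 30;
c.name = "Ignition";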

 Inline Cache: 
Inline Cache is a data structure used to keep track of the addresses of the properties on objects, thereby reducing the lookup time.
It tracks all the LOAD, STORE, and CALL events within a function, by maintaining a Feedback Vector.
Feedback Vector is simply an array used to track all the Inline Caches of a particular function.

Example : 

const sum = (a, b) => {
    return a+b;
}

For the above example, the IC is :
[{ slot: 0, icType: LOAD, value: UNINIT}]
Here, the function has
one IC
with type LOAD
value UNINIT,
which means that the inline cache has not been initialized yet (the function has not been called).

Upon Calling the function :

sum(5, 10)
sum(5, "GeeksForGeeks")

On the first call, the IC changes to:
[{ slot: 0, icType: LOAD, value: MONO(I) }]
Here the IC is monomorphic (MONO) over integers (I):
the arguments passed so far have only been of integer type, i.e. the optimized code is specialized for integer values.

On the second call, the IC changes to:
[{ slot: 0, icType: LOAD, value: POLY[I,S] }]
Here the IC is polymorphic (POLY) over integers and strings ([I,S]): the arguments passed can be either integers or strings, i.e. the function now handles both types.
As a result, the function runs faster when the types of the arguments it receives do not change between calls.

Inline caches keep track of how frequently they’re used and provide necessary feedback to the Turbofan compiler.
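
A rough way to feel the difference between the monomorphic and polymorphic cases is to time the same function with stable and with mixed argument types. The numbers depend heavily on the engine version and machine, and may be small for such a tiny function, so treat this only as an experiment:

const sum = (a, b) => a + b;

// Monomorphic: the IC for `a + b` only ever sees integers -> MONO(I).
console.time("mono");
let m = 0;
for (let i = 0; i < 1e6; i++) m = sum(m, 1);
console.timeEnd("mono");

// Polymorphic: the same call site now also sees strings -> POLY[I,S].
console.time("poly");
let p = "";
for (let i = 0; i < 1e6; i++) p = sum("GeeksForGeeks", i);
console.timeEnd("poly");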


Created: 2024-02-12